Copula Correlation: An Equitable Dependence Measure and Extension of Pearson's Correlation
In Science, Reshef et al. (2011) proposed the concept of equitability for
measures of dependence between two random variables. To this end, they proposed
a novel measure, the maximal information coefficient (MIC). Recently, a PNAS
paper (Kinney and Atwal, 2014) gave a mathematical definition of equitability.
They proved that MIC is in fact not equitable, while a fundamental
information-theoretic measure, the mutual information (MI), is self-equitable. In this
paper, we show that MI also does not correctly reflect the proportion of
deterministic signals hidden in noisy data. We propose a new equitability
definition based on this scenario. The copula correlation (Ccor), based on the
L1-distance of copula density, is shown to be equitable under both definitions.
We also prove theoretically that Ccor is much easier to estimate than MI.
Numerical studies illustrate the properties of these measures.
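The non-equitability of Pearson's correlation that motivates this line of work is easy to see numerically. The sketch below (not from the paper; names are illustrative) scores a linear and a quadratic relationship that are both perfectly deterministic and noise-free, yet Pearson's r rates them very differently:

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two noise-free deterministic relationships on a symmetric grid.
xs = [i / 100 for i in range(-100, 101)]
linear = [2 * x + 1 for x in xs]    # y = 2x + 1
parabola = [x * x for x in xs]      # y = x^2, equally deterministic

r_lin = pearson(xs, linear)         # 1.0: full marks for the line
r_par = pearson(xs, parabola)       # ~0.0: same determinism, near-zero score
```

An equitable measure would assign both relationships the same maximal score, which is the benchmark that MIC, MI, and Ccor are evaluated against in this literature.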
Alternative Justifications for Academic Support III: An Empirical Analysis of the Impact of Academic Support on Perceived Autonomy Support and Humanizing Law Schools
This article details the findings of a two-year empirical study on the impact of a law school academic support program (ASP) on law students. The hypothesis of the study was that as students' participation in a well-resourced, open-access ASP increases, students' perception of autonomy support and humanizing grows as well. The study concludes, based upon statistically significant data, that law school ASPs impact students in positive ways and are therefore worth the investment. This article is the third in a series designed to show that law school academic support measures positively impact students' well-being and lead to a more robust educational experience.
Granger causality and transfer entropy are equivalent for Gaussian variables
Granger causality is a statistical notion of causal influence based on
prediction via vector autoregression. Developed originally in the field of
econometrics, it has since found application in a broader arena, particularly
in neuroscience. More recently transfer entropy, an information-theoretic
measure of time-directed information transfer between jointly dependent
processes, has gained traction in a similarly wide field. While it has been
recognized that the two concepts must be related, the exact relationship has
until now not been formally described. Here we show that for Gaussian
variables, Granger causality and transfer entropy are entirely equivalent, thus
bridging autoregressive and information-theoretic approaches to data-driven
causal inference.
Comment: In review, Phys. Rev. Lett., Nov. 2009
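The stated equivalence can be checked numerically. The following stdlib-only sketch (illustrative, not the authors' code) simulates a bivariate Gaussian VAR(1) in which X drives Y, estimates Granger causality as the log ratio of restricted to full residual variances, and reads off transfer entropy as half that value in nats:

```python
import math
import random

rng = random.Random(0)

# Bivariate Gaussian VAR(1): X drives Y (coefficient 0.4); Y does not drive X.
n = 5000
x, y = [0.0], [0.0]
for _ in range(n):
    xp, yp = x[-1], y[-1]
    x.append(0.5 * xp + rng.gauss(0, 1))
    y.append(0.5 * yp + 0.4 * xp + rng.gauss(0, 1))

def rss1(t, p):
    """Residual sum of squares regressing t on one predictor p (with intercept)."""
    n = len(t)
    mt, mp = sum(t) / n, sum(p) / n
    tc = [v - mt for v in t]
    pc = [v - mp for v in p]
    b = sum(u * v for u, v in zip(tc, pc)) / sum(v * v for v in pc)
    return sum((u - b * v) ** 2 for u, v in zip(tc, pc))

def rss2(t, p, q):
    """RSS regressing t on two predictors p, q (2x2 normal equations)."""
    n = len(t)
    mt, mp, mq = sum(t) / n, sum(p) / n, sum(q) / n
    tc = [v - mt for v in t]
    pc = [v - mp for v in p]
    qc = [v - mq for v in q]
    spp = sum(v * v for v in pc)
    sqq = sum(v * v for v in qc)
    spq = sum(u * v for u, v in zip(pc, qc))
    stp = sum(u * v for u, v in zip(tc, pc))
    stq = sum(u * v for u, v in zip(tc, qc))
    det = spp * sqq - spq * spq
    b1 = (stp * sqq - stq * spq) / det
    b2 = (stq * spp - stp * spq) / det
    return sum((u - b1 * v - b2 * w) ** 2 for u, v, w in zip(tc, pc, qc))

def granger(target, source):
    """F = ln(RSS_restricted / RSS_full), one lag."""
    t, tlag, slag = target[1:], target[:-1], source[:-1]
    return math.log(rss1(t, tlag) / rss2(t, tlag, slag))

F_xy = granger(y, x)    # X -> Y: clearly positive
F_yx = granger(x, y)    # Y -> X: near zero
te_xy = F_xy / 2        # transfer entropy in nats, by the Gaussian equivalence
```

For Gaussian variables the transfer entropy is exactly half the Granger causality statistic (in nats), so `te_xy` comes for free from the regression-based estimate.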
Multivariate Granger Causality and Generalized Variance
Granger causality analysis is a popular method for inference on directed
interactions in complex systems of many variables. A shortcoming of the
standard framework for Granger causality is that it only allows for examination
of interactions between single (univariate) variables within a system, perhaps
conditioned on other variables. However, interactions do not necessarily take
place between single variables, but may occur among groups, or "ensembles", of
variables. In this study we establish a principled framework for Granger
causality in the context of causal interactions among two or more multivariate
sets of variables. Building on Geweke's seminal 1982 work, we offer new
justifications for one particular form of multivariate Granger causality based
on the generalized variances of residual errors. Taken together, our results
support a comprehensive and theoretically consistent extension of Granger
causality to the multivariate case. Treated individually, they highlight
several specific advantages of the generalized variance measure, which we
illustrate using applications in neuroscience as an example. We further show
how the measure can be used to define "partial" Granger causality in the
multivariate context and we also motivate reformulations of "causal density"
and "Granger autonomy". Our results are directly applicable to experimental
data and promise to reveal new types of functional relations in complex
systems, neural and otherwise.
Comment: added 1 reference, minor change to discussion, typos corrected; 28 pages, 3 figures, 1 table, LaTeX
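The generalized-variance form of the measure has a compact numerical sketch. Below (illustrative names and synthetic stand-in residuals, not the authors' code), the multivariate Granger statistic is the log ratio of determinants of the residual covariance matrices from the restricted and full models:

```python
import math
import random

rng = random.Random(2)

def cov2(rows):
    """2x2 sample covariance of paired residuals [(e1, e2), ...]."""
    n = len(rows)
    m1 = sum(r[0] for r in rows) / n
    m2 = sum(r[1] for r in rows) / n
    c11 = sum((r[0] - m1) ** 2 for r in rows) / (n - 1)
    c22 = sum((r[1] - m2) ** 2 for r in rows) / (n - 1)
    c12 = sum((r[0] - m1) * (r[1] - m2) for r in rows) / (n - 1)
    return c11, c12, c22

def generalized_variance(c11, c12, c22):
    """Determinant of the 2x2 residual covariance matrix."""
    return c11 * c22 - c12 * c12

def multivariate_gc(res_restricted, res_full):
    """Geweke-style statistic: ln det(Sigma_restricted) - ln det(Sigma_full)."""
    return math.log(generalized_variance(*cov2(res_restricted)) /
                    generalized_variance(*cov2(res_full)))

# Stand-in residuals: excluding the source ensemble (restricted model)
# leaves extra unexplained variance in both target components.
full = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(4000)]
restricted = [(e1 + rng.gauss(0, 0.5), e2 + rng.gauss(0, 0.5))
              for e1, e2 in full]

F = multivariate_gc(restricted, full)   # > 0: the source ensemble is predictive
```

Using the determinant rather than, say, the trace makes the measure invariant under linear transformations of the target ensemble, which is one of the theoretical advantages argued for in the paper.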
Simpler, Faster, and More Robust T-test Based Leakage Detection
The TVLA procedure using the t-test has become a popular
leakage detection method. To protect against environmental fluctuation
in laboratory measurements, we propose a paired t-test to improve
the standard procedure. We take advantage of statistical matched-pairs
design to remove the environmental noise effect in leakage detection.
Higher order leakage detection is further improved with a moving average
method. We compare the proposed tests with the standard t-test on synthetic data and physical measurements. Our results show that the proposed tests are robust to environmental noise.
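A synthetic sketch of the idea (not the paper's code; the drift model and constants are assumptions): when each fixed/random pair of measurements shares the same slow environmental drift, differencing within pairs removes the drift before the t-statistic is computed, so the paired test recovers a leakage difference that the standard (Welch) test leaves buried in drift variance.

```python
import math
import random

rng = random.Random(1)

n = 2000
delta = 0.05                        # true (small) leakage difference
fixed, rand_ = [], []
for i in range(n):
    drift = 2 * math.sin(i / 50)    # slow environmental drift, shared per pair
    fixed.append(delta + drift + rng.gauss(0, 0.1))
    rand_.append(drift + rng.gauss(0, 0.1))

def welch_t(a, b):
    """Standard unpaired (Welch) t-statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((v - ma) ** 2 for v in a) / (na - 1)
    vb = sum((v - mb) ** 2 for v in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

def paired_t(a, b):
    """Paired t-statistic on within-pair differences."""
    d = [u - v for u, v in zip(a, b)]
    n = len(d)
    md = sum(d) / n
    vd = sum((v - md) ** 2 for v in d) / (n - 1)
    return md / math.sqrt(vd / n)

t_standard = welch_t(fixed, rand_)  # drift inflates the variance estimate
t_paired = paired_t(fixed, rand_)   # drift cancels in the differences
```

With the usual TVLA detection threshold of |t| > 4.5, the paired statistic flags the leakage on this synthetic data while the standard one does not, despite identical inputs.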
High pressure evolution of Fe2O3 electronic structure revealed by X-ray absorption
We report the first high pressure measurement of the Fe K-edge in hematite
(Fe2O3) by X-ray absorption spectroscopy in partial fluorescence yield
geometry. The pressure-induced evolution of the electronic structure as
Fe2O3 transforms from a high-spin insulator to a low-spin metal is
reflected in the x-ray absorption pre-edge. The crystal field splitting energy
was found to increase monotonically with pressure up to 48 GPa, above which a
series of phase transitions occur. Atomic multiplet, cluster diagonalization,
and density-functional calculations were performed to simulate the pre-edge
absorption spectra, showing good qualitative agreement with the measurements.
The mechanism for the pressure-induced phase transitions of Fe2O3 is
discussed and it is shown that ligand hybridization significantly reduces the
critical high-spin/low-spin pressure.
Comment: 5 pages, 4 figures and 1 table
Towards Secure Cryptographic Software Implementation Against Side-Channel Power Analysis Attacks
Side-channel attacks have been a real threat against many critical embedded systems that rely on cryptographic algorithms as their security engine. A commonly used algorithmic countermeasure, random masking, incurs large execution delay and resource overhead. The other countermeasure, operation shuffling or permutation, can mitigate side-channel leakage effectively with minimal overhead. In this paper, we exploit the independence among operations in cryptographic algorithms and randomize their execution order. We design a tool to automatically detect such independence between statements at the source code level and devise an algorithm for automatic operation shuffling. We test our algorithm on the new SHA-3 standard, Keccak. Results show that the tool effectively implements operation shuffling to reduce side-channel leakage significantly, and can therefore guide automatic secure cryptographic software implementations against differential power analysis attacks.
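The core idea, randomly permuting mutually independent operations, can be sketched in a few lines (a toy illustration, not the tool's output; the S-box here is an arbitrary stand-in, not Keccak's):

```python
import random

SBOX = [(x * 7 + 3) % 256 for x in range(256)]   # toy S-box stand-in

def substitute_shuffled(state, rng):
    """Apply the S-box to every byte in a randomly permuted order.

    The byte-wise substitutions are mutually independent, so any
    execution order yields the same result, while the randomized order
    decorrelates the power trace from the data being processed.
    """
    out = list(state)
    order = list(range(len(state)))
    rng.shuffle(order)               # Fisher-Yates permutation
    for i in order:
        out[i] = SBOX[state[i]]
    return bytes(out)

state = bytes(range(16))
r1 = substitute_shuffled(state, random.Random(0))
r2 = substitute_shuffled(state, random.Random(99))
assert r1 == r2                      # order changes, result does not
```

Detecting which statements commute like this at the source level is exactly the independence analysis the paper automates.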
Differential Fault Analysis of SHA3-224 and SHA3-256
The security of SHA-3 against different kinds of attacks is of vital importance for crypto systems with SHA-3 as the security engine. In this paper, we look into the differential fault analysis of SHA-3; this is the first work to break SHA3-224 and SHA3-256 using differential fault analysis. Compared with the one existing related work, we relax the fault models and make them realistic for different implementation architectures. We analyze fault propagation in SHA-3 under such single-byte fault models, and propose to use fault signatures at the observed output for analysis and secret retrieval. Results show that the proposed method can effectively identify the injected single-byte faults, and then recover the whole internal state at the input of the last round operation for both SHA3-224 and SHA3-256.
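The fault-signature idea can be illustrated on a toy linear round (a stand-in for Keccak's diffusion, not the paper's attack code): a single-byte fault injected before a round with limited diffusion leaves a localized XOR differential at the output, from which the fault position and value can be read back.

```python
def toy_round(state):
    """Toy diffusion step: each output byte XORs two neighbouring bytes
    (a stand-in for a linear Keccak-like step with limited diffusion)."""
    n = len(state)
    return [state[i] ^ state[(i + 1) % n] for i in range(n)]

def signature(a, b):
    """Byte-wise XOR differential between correct and faulty outputs."""
    return [x ^ y for x, y in zip(a, b)]

state = list(range(8))
correct = toy_round(state)

pos, delta = 5, 0x2A                     # single-byte fault before the round
faulty_state = list(state)
faulty_state[pos] ^= delta
faulty = toy_round(faulty_state)

sig = signature(correct, faulty)
# Nonzero signature bytes appear only where the fault propagates, so the
# fault position and value are recoverable from the observed output alone.
hits = [i for i, v in enumerate(sig) if v]
```

Here the fault at byte 5 touches exactly outputs 4 and 5, each with differential 0x2A; matching observed signatures against precomputed fault signatures is what lets the analysis identify the injected fault and unwind the internal state.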
An Improvement of Both Security and Reliability for Keccak Implementations on Smart Card
As the new SHA-3 standard, Keccak has attracted a lot of attention regarding its security and reliability. Previous works have already shown that both software and hardware implementations of Keccak have strong side-channel power (electromagnetic) leakages, and that these leakages can easily be used by attackers to recover secret key bits. Meanwhile, Keccak is vulnerable to random errors and injected faults, which cause errors in the computation results. In this paper, we introduce a scheme based on the round rotation invariance property of Keccak to reduce side-channel leakages while improving reliability. The proposed scheme is resource friendly. Side-channel analysis results show that this method can efficiently reduce the side-channel leakages of Keccak implementations, and fault injection simulation results show that the proposed scheme can effectively improve the reliability of Keccak implementations, with error coverage of almost 100%.